Extension of Recurrent Kernels to different Reservoir Computing topologies
D'Inverno, Giuseppe Alessio, Dong, Jonathan
Reservoir Computing is a machine learning technique used for training Recurrent Neural Networks, which fixes the internal weights of the network and trains only a linear layer, resulting in faster training times [9]. Its simplicity and effectiveness have made it a popular choice for various tasks [12]. Additionally, the random connections within Reservoir Computing networks make them a useful framework for comparison with biological neural networks. As kernel methods require the calculation of scalar products between all pairs of input points, recurrent kernels offer an interesting alternative to Reservoir Computing when the number of data points is limited. Additionally, recurrent kernels have been useful for theoretical studies, such as stability analysis in Reservoir Computing, as they provide a deterministic limit with analytical expressions [2]. Prior research on Recurrent Kernels has been mainly limited to vanilla Reservoir Computing and structured transforms.
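As a concrete companion to the description above, here is a minimal sketch of vanilla Reservoir Computing in NumPy: internal weights are drawn once at random and frozen, and only a linear readout is trained by ridge regression. The reservoir size, weight scalings, and toy task are illustrative choices, not taken from either paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 200, 1                       # reservoir size, input dimension (toy values)

# Internal weights: drawn once at random, never trained.
# The scaling keeps the spectral radius of W_res below 1 for stable dynamics.
W_res = rng.normal(0.0, 0.9 / np.sqrt(N), (N, N))
W_in = rng.normal(0.0, 1.0, (N, d))

def run_reservoir(inputs):
    """Collect the states x_{t+1} = tanh(W_res @ x_t + W_in @ u_t)."""
    x, states = np.zeros(N), []
    for u in inputs:
        x = np.tanh(W_res @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.1 * np.arange(1000))
X, y = run_reservoir(u[:-1]), u[1:]

# Only the linear readout is trained, here by ridge regression.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```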
Reservoir Computing meets Recurrent Kernels and Structured Transforms
Dong, Jonathan, Ohana, Ruben, Rafayelyan, Mushegh, Krzakala, Florent
Reservoir Computing is a class of simple yet efficient Recurrent Neural Networks where internal weights are fixed at random and only a linear output layer is trained. In the large size limit, such random neural networks have a deep connection with kernel methods. Our contributions are threefold: a) We rigorously establish the recurrent kernel limit of Reservoir Computing and prove its convergence. b) We test our models on chaotic time series prediction, a classic but challenging benchmark in Reservoir Computing, and show how the Recurrent Kernel is competitive and computationally efficient when the number of data points remains moderate. c) When the number of samples is too large, we leverage the success of structured Random Features for kernel approximation by introducing Structured Reservoir Computing. The two proposed methods, Recurrent Kernel and Structured Reservoir Computing, turn out to be much faster and more memory-efficient than conventional Reservoir Computing.
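Contribution a) can be illustrated numerically. For random features whose large-size limit is the RBF kernel (for instance a cosine nonlinearity with Gaussian weights and random biases, a case with a closed-form limit), the scalar product between two reservoir states obeys a deterministic recursion. The sketch below assumes that setting with unit weight variances, so the constants are illustrative rather than taken from the paper.

```python
import numpy as np

def recurrent_rbf_kernel(U, V):
    """Iterate the deterministic recurrent-kernel recursion for two input
    time series U, V of shape (T, d):
        k_{t+1}(u, v) = exp(-(k_t(u, u) + k_t(v, v) - 2 k_t(u, v)
                              + ||u_t - v_t||^2) / 2)
    with both reservoirs started at x_0 = 0.
    """
    k_uv = k_uu = k_vv = 0.0
    for u_t, v_t in zip(U, V):
        d2 = k_uu + k_vv - 2.0 * k_uv + np.sum((u_t - v_t) ** 2)
        k_uv = np.exp(-d2 / 2.0)
        k_uu = k_vv = 1.0    # normalized RBF features have unit squared norm
    return k_uv

# Example: kernel between two short scalar time series.
T = 50
U = np.sin(0.1 * np.arange(T)).reshape(-1, 1)
V = np.sin(0.1 * np.arange(T) + 0.5).reshape(-1, 1)
print(recurrent_rbf_kernel(U, V))
```

Filling a Gram matrix with this recursion over n time series costs O(n^2 T), which is why the abstract notes that the Recurrent Kernel is competitive when the number of data points remains moderate.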
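Contribution c) swaps the dense N x N reservoir matrix for a structured transform. One standard construction from the structured Random Features literature, shown below as a plausible sketch of the idea rather than the paper's exact construction, chains fast Walsh-Hadamard transforms with random sign diagonals (a hypothetical HD3 variant), emulating a random projection in O(N log N) time and O(N) memory instead of O(N^2).

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform, O(N log N); len(x) must be a power of 2."""
    x = x.copy()
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            a, b = x[i:i + h].copy(), x[i + h:i + 2 * h].copy()
            x[i:i + h], x[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return x / np.sqrt(len(x))   # orthonormal scaling

rng = np.random.default_rng(0)
N = 256                          # must be a power of 2
D = [rng.choice([-1.0, 1.0], size=N) for _ in range(3)]  # random sign flips

def structured_matvec(x):
    """(HD)(HD)(HD) @ x: a structured stand-in for a dense random matrix,
    needing O(N) memory instead of an explicit N x N array."""
    for d in D:
        x = fwht(d * x)
    return x
```

A structured reservoir update would then read x_{t+1} = f(structured_matvec(x_t) + W_in @ u_t), keeping the random-projection behavior while never storing a dense recurrent matrix, which is where the speed and memory gains claimed in the abstract come from.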